
    Oil Spill Detection Analyzing "Sentinel-2" Satellite Images: A Persian Gulf Case Study

    Oil spills near exploitation areas and oil loading ports often result from governments' ambitions to gain a larger share of the oil market and from negligence while loading large tankers or ships. The present study investigates an oil spill event in the Al Khafji zone (between Kuwait and Saudi Arabia) using multi-sensor satellite images. Oil slicks over the Persian Gulf were characterized with multi-sensor satellite imagery and then analyzed in order to detect and classify oil spills in this zone. In particular, this paper discusses oil pollution detection in the Persian Gulf using multi-sensor satellite image data. Oil spill images were selected from Sentinel-2 scenes pinpointing oil spill zones. ENVI software was used for analysing the satellite images, and ADIOS (Automated Data Inquiry for Oil Spills) for oil weathering modelling. The results for the Al Khafji zone show that the oil spill moved towards the coastline, first increasing its surface area and then decreasing it until reaching the coastline.

    An optimized ultrasound detector for photoacoustic breast tomography

    Photoacoustic imaging has proven able to detect the vascularization-driven optical absorption contrast associated with tumors. In order to detect breast tumors located a few centimeters deep in tissue, a sensitive ultrasound detector is of crucial importance for photoacoustic mammography. Further, because the expected photoacoustic frequency bandwidth (a few MHz to tens of kHz) is inversely proportional to the dimensions of the light-absorbing structures (0.5 to 10+ mm), proper choices of materials and their geometries, and proper design considerations, have to be made for optimal photoacoustic detectors. In this study, we design and evaluate a specialized ultrasound detector for photoacoustic mammography. Based on the required detector sensitivity and its frequency response, a selection of active material and matching layers and their geometries is made, leading to functional detector models. By iterating between simulation of detector performance, fabrication, and experimental characterization of functional models, an optimized implementation is made and evaluated. The experimental results of the first and second functional detectors matched the simulations. In subsequent bare piezoelectric samples the effect of lateral resonances was addressed and their influence minimized by sub-dicing the samples. Consequently, using simulations, the final optimized detector could be designed, with a center frequency of 1 MHz and a -6 dB bandwidth of ~80%. The minimum detectable pressure was measured to be 0.5 Pa, which will facilitate deeper imaging compared to current systems. The detector should be capable of detecting vascularized tumors with a resolution of 1-2 mm. Further improvements by proper electrical grounding and shielding, and implementation of this design into an arrayed detector, will pave the way for clinical applications of photoacoustic mammography.
    Comment: Accepted for publication in Medical Physics (American Association of Physicists in Medicine).
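The inverse relation between absorber size and acoustic frequency mentioned in this abstract can be illustrated with a rough back-of-the-envelope sketch. This is only an order-of-magnitude estimate assuming f ≈ c/d with a typical tissue speed of sound of 1500 m/s; the exact proportionality constant depends on absorber geometry and is not taken from the paper:

```python
# Rough characteristic photoacoustic frequency for a light-absorbing
# structure: f ~ c / d, where c is the speed of sound and d the absorber
# size. The prefactor (here taken as 1) depends on geometry, so this is
# an order-of-magnitude sketch only.

C_TISSUE = 1500.0  # typical speed of sound in soft tissue, m/s

def characteristic_frequency_hz(diameter_m: float) -> float:
    """Order-of-magnitude photoacoustic frequency for an absorber of size d."""
    return C_TISSUE / diameter_m

for d_mm in (0.5, 1.0, 10.0):
    f = characteristic_frequency_hz(d_mm * 1e-3)
    print(f"{d_mm:5.1f} mm absorber -> ~{f / 1e6:.2f} MHz")
```

With these assumptions, 0.5 mm structures map to roughly 3 MHz and 10 mm structures to roughly 0.15 MHz, consistent with the frequency span and the ~1 MHz center frequency quoted in the abstract.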

    Passive element enriched photoacoustic computed tomography (PER PACT) for simultaneous imaging of acoustic propagation properties and light absorption

    We present a ‘hybrid’ imaging approach that can image both the light absorption properties and the acoustic transmission properties of an object in a two-dimensional slice using a computed tomography (CT) photoacoustic imager. The ultrasound transmission measurement method uses a strong optical absorber of small cross-section placed in the path of the light illuminating the sample. This absorber, which we call a passive element, acts as a source of ultrasound. The interaction of this ultrasound with the sample can be measured in transmission, using the same ultrasound detector used for photoacoustics. Such measurements are made at various angles around the sample in a CT approach. Images of the ultrasound propagation parameters, attenuation and speed of sound, can be reconstructed by inversion of a measurement model. We validate the method on specially designed phantoms and biological specimens. The obtained images are quantitative in terms of the shape, size, location, and acoustic properties of the examined heterogeneities.
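The idea of recovering speed of sound from transmission measurements can be sketched with the classic substitution method: compare the pulse arrival time through the coupling medium alone with the arrival time when a sample of known thickness sits in the path. This is a minimal illustrative sketch, not the paper's model-based inversion; path length, thickness, and the water reference value are made-up example numbers:

```python
# Substitution method for speed of sound:
#   t_ref    = L / c_water                        (water-only path)
#   t_sample = (L - d) / c_water + d / c_sample   (sample of thickness d inserted)
# Solving for the sample's speed of sound:
#   c_sample = d / (d / c_water - (t_ref - t_sample))

C_WATER = 1480.0  # m/s, approximate speed of sound in water near 20 degC

def speed_of_sound(d: float, t_ref: float, t_sample: float,
                   c_water: float = C_WATER) -> float:
    """Estimate a sample's speed of sound from the time shift it causes."""
    dt = t_ref - t_sample  # positive when the sample is faster than water
    return d / (d / c_water - dt)

# Synthetic example: 10 mm sample in a 100 mm path, true c_sample = 1550 m/s.
L, d, c_true = 0.100, 0.010, 1550.0
t_ref = L / C_WATER
t_sample = (L - d) / C_WATER + d / c_true
print(f"recovered c_sample = {speed_of_sound(d, t_ref, t_sample):.1f} m/s")
```

The synthetic round trip recovers the assumed 1550 m/s exactly, which is the sanity check one would run before applying such a relation to measured arrival times.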

    Design thinking and acceptance requirements for designing gamified software.

    Gamification is increasingly applied to engage people in performing tool-supported collaborative tasks. From previous experiences we learned that available gamification guidelines are not sufficient and, more importantly, that motivational and acceptance aspects need to be considered when designing gamified software applications. To understand these aspects, stakeholders need to be involved in the design process. This paper aims to (i) identify key requirements for designing gamified solutions, and (ii) understand whether existing methods (partially fitting those requirements) can be selected and combined to provide a comprehensive gamification design method. We discuss a set of key requirements for a suitable gamification design method and illustrate how to select and combine existing methods, using Design Thinking and the Agon framework, to define a design approach that fits those requirements. Furthermore, we present a first empirical evaluation of the integrated design method, with participants including both requirements analysts and end-users of the gamified software. Our evaluation offers initial ideas towards a more general, systematic approach for gamification design.

    Adversarial Attacks Against Uncertainty Quantification

    Machine-learning models can be fooled by adversarial examples, i.e., carefully crafted input perturbations that force models to output wrong predictions. While uncertainty quantification (UQ) has recently been proposed to detect adversarial inputs, under the assumption that such attacks exhibit higher prediction uncertainty than pristine data, it has been shown that adaptive attacks specifically aimed at also reducing the uncertainty estimate can easily bypass this defense mechanism. In this work, we focus on a different adversarial scenario in which the attacker is still interested in manipulating the uncertainty estimate, but regardless of the correctness of the prediction; in particular, the goal is to undermine the use of machine-learning models when their outputs are consumed by a downstream module or by a human operator. Following this direction, we: (i) design a threat model for attacks targeting uncertainty quantification; (ii) devise different attack strategies on conceptually different UQ techniques, spanning both classification and semantic segmentation problems; (iii) conduct a first complete and extensive analysis comparing some of the most widely employed UQ approaches under attack. Our extensive experimental analysis shows that our attacks are more effective in manipulating uncertainty quantification measures than attacks that also aim to induce misclassifications.
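The flavour of attack described here, perturbing an input to manipulate the uncertainty estimate rather than the predicted class, can be sketched on a toy model. This is a deliberately simple illustration, not one of the paper's attacks: a single sign-gradient step that lowers the predictive entropy of a fixed binary logistic model, making it look more confident while moving the input away from, not across, the decision boundary:

```python
import numpy as np

# Toy uncertainty-targeting attack on a binary logistic model (illustrative
# only; real attacks of this kind target UQ methods on deep networks).
rng = np.random.default_rng(0)
w = rng.normal(size=5)  # fixed "model" weights

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + np.exp(-z))

def entropy(x: np.ndarray) -> float:
    """Predictive (binary) entropy of the logistic model at input x."""
    p = sigmoid(w @ x)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def entropy_grad(x: np.ndarray) -> np.ndarray:
    # dH/dx = log((1-p)/p) * p * (1-p) * w for the logistic model above.
    p = sigmoid(w @ x)
    return np.log((1 - p) / p) * p * (1 - p) * w

x = rng.normal(size=5)
eps = 0.1
# FGSM-style step that *decreases* entropy: the input moves away from the
# decision boundary, so the predicted class is unchanged, but the model's
# uncertainty estimate drops.
x_adv = x - eps * np.sign(entropy_grad(x))
print(f"H before: {entropy(x):.4f}, H after: {entropy(x_adv):.4f}")
```

Because entropy of a logistic model decreases monotonically as the input moves away from the boundary, this step reduces the uncertainty without inducing a misclassification, which is exactly the scenario the abstract distinguishes from ordinary adversarial examples.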

    Shelf life of fresh air-packaged and precooked vacuum-packaged quails

    The shelf life of three batches (Q1, Q2, Q3) of quail meat was examined. Q1 quails were cut and seasoned with commercial olive oil, stoned green olives and sliced bacon. Q2 was divided into two subgroups: Q2.1, produced under the previously described conditions, and Q2.2, seasoned also with rosemary. These quails were placed in low-density polystyrene barrier trays and aerobically packaged. Q3 quails were boiled in salted water for 40 min, seasoned with myrtle leaves, placed in low-density polyethylene bags and vacuum packaged. All samples were stored at +2 and +7°C. Analyses were conducted at 0, 3, 7, 9 and 14 days (T0, T3, T7, T9 and T14, respectively). For all samples, pH measurement and microbial analyses [total viable count (TVC), Enterobacteriaceae, E. coli, Lactobacillus spp. (LAB), Pseudomonas spp., Brochothrix thermosphacta, coagulase-negative staphylococci (CNS), Enterococcus spp., yeasts and moulds, Salmonella spp., Listeria monocytogenes] were performed. Initial TVC levels of fresh quails (ca. 4 log CFU/g) were rather high, which may be due to the microbial population of the raw material. In Q1 and Q2.1 samples, TVC reached 7 log CFU/g, considered the upper acceptability limit for fresh poultry meat, after T9 under storage at +2°C and after T7 at +7°C. In Q2.2 samples this limit was reached earlier, after T3. In Q3 samples, lower TVC levels were recorded and never reached the above-mentioned limit, not even at the end of storage; however, mean counts >5 log CFU/g were reached, possibly because of post-cooking cross-contamination. Salmonella spp. prevalence was 33% in Q1, Q2.1 and Q2.2 samples.

    An identification and a prioritisation of geographic and temporal data gaps of Mediterranean marine databases

    To get an overall view of the primary data available from existing Earth Observation systems and network databases for the Mediterranean Sea, the main objective of this paper is to identify temporal and geographic data gaps and to elaborate a new method for prioritising missing data, useful for end-users who have to define strategies and models to fill these gaps. Existing data sources were identified by analysing the main projects and information systems available. A new method to perform the data gap analysis was developed and applied to the whole Mediterranean basin as a case study area, identifying and prioritising geographic and temporal data gaps while considering and integrating the biological, geological, chemical and physical branches of the total environment. The results highlight both the main geographic data gaps, subdividing the whole Mediterranean Sea into 23 sub-basins, and the temporal data gaps, considering data gathered since 1990. Particular attention has been paid to the suitability of data in terms of completeness, accessibility and aggregation, since data and information are often aggregated and cannot be used for research needs. The elaborated inventory of existing data sources comprises a database of 477 data rows originating from the 122 data platforms analysed, specifying for each dataset the related data typologies and its accessibility. The results indicate that 76% of the data come from ongoing platforms, while the remaining 24% relate to platforms with non-operational monitoring systems. Since the large amount of analysed records includes data gathered in inhomogeneous ways, the prioritisation values obtained for each identified data gap simplify data comparison and analysis. Lastly, the data gap inventory contains geographic and temporal information for every missing parameter at the whole-basin scale, as well as the spatial resolution of each available dataset.